Interquantile shrinkage and variable selection in quantile regression
Authors
Abstract
Examination of multiple conditional quantile functions provides a comprehensive view of the relationship between the response and covariates. In situations where quantile slope coefficients share some common features, estimation efficiency and model interpretability can be improved by utilizing such commonality across quantiles. Furthermore, elimination of irrelevant predictors will also aid in estimation and interpretation. These motivations lead to the development of two penalization methods, which can identify the interquantile commonality and nonzero quantile coefficients simultaneously. The developed methods are based on a fused penalty that encourages sparsity of both quantile coefficients and interquantile slope differences. The oracle properties of the proposed penalization methods are established. Through numerical investigations, it is demonstrated that the proposed methods lead to simpler model structure and higher estimation efficiency than the traditional quantile regression estimation.
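The fused-penalty idea described above can be made concrete with a standard penalized quantile-regression objective. The notation below is ours, not taken from the paper: \(\rho_{\tau}(u) = u\{\tau - I(u < 0)\}\) is the usual quantile check loss, \(\beta^{(k)}\) is the slope vector at quantile level \(\tau_k\), and \(\lambda_1, \lambda_2\) are tuning parameters.

```latex
\min_{\beta^{(1)},\dots,\beta^{(K)}}
\sum_{k=1}^{K} \sum_{i=1}^{n} \rho_{\tau_k}\!\left(y_i - x_i^{\top}\beta^{(k)}\right)
% sparsity of the quantile coefficients themselves
+ \lambda_1 \sum_{k=1}^{K} \sum_{j=1}^{p} \left|\beta_j^{(k)}\right|
% sparsity of interquantile slope differences (the "fused" part)
+ \lambda_2 \sum_{k=2}^{K} \sum_{j=1}^{p} \left|\beta_j^{(k)} - \beta_j^{(k-1)}\right|
```

The second penalty shrinks adjacent quantile slopes toward each other, so a coefficient that is truly constant across quantiles is estimated as a single common value, while the first penalty removes irrelevant predictors entirely.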
Similar resources
Interquantile Shrinkage in Regression Models.
Conventional analysis using quantile regression typically focuses on fitting the regression model at different quantiles separately. However, in situations where the quantile coefficients share some common feature, joint modeling of multiple quantiles to accommodate the commonality often leads to more efficient estimation. One example of common features is that a predictor may have a constant e...
Variable Selection in Nonparametric and Semiparametric Regression Models
This chapter reviews the literature on variable selection in nonparametric and semiparametric regression models via shrinkage. We highlight recent developments on simultaneous variable selection and estimation through the methods of least absolute shrinkage and selection operator (Lasso), smoothly clipped absolute deviation (SCAD) or their variants, but restrict our attention to nonparametric a...
Model selection in quantile regression models
Lasso methods are regularization and shrinkage methods widely used for subset selection and estimation in regression problems. From a Bayesian perspective, the Lasso-type estimate can be viewed as a Bayesian posterior mode when specifying independent Laplace prior distributions for the coefficients of independent variables (Park and Casella, 2008). A scale mixture of normal priors can also prov...
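The scale-mixture representation mentioned in the blurb above is the standard one used by Park and Casella (2008): a Laplace prior on a coefficient \(\beta\) can be written as a normal prior whose variance \(s\) is itself exponentially distributed. Notation here is ours:

```latex
% Laplace prior with rate \lambda ...
p(\beta \mid \lambda) \;=\; \frac{\lambda}{2}\, e^{-\lambda |\beta|}
% ... equals a scale mixture of normals with exponential mixing on the variance
\;=\; \int_{0}^{\infty} \mathcal{N}\!\left(\beta \mid 0,\, s\right)\,
\frac{\lambda^{2}}{2}\, e^{-\lambda^{2} s / 2}\, \mathrm{d}s .
```

This representation is what makes Gibbs sampling tractable for the Bayesian Lasso: conditional on \(s\), the prior on \(\beta\) is Gaussian.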
Variable Selection for Nonparametric Quantile Regression via Smoothing Spline ANOVA.
Quantile regression provides a more thorough view of the effect of covariates on a response. Nonparametric quantile regression has become a viable alternative to avoid restrictive parametric assumption. The problem of variable selection for quantile regression is challenging, since important variables can influence various quantiles in different ways. We tackle the problem via regularization in...
Editorial for the special issue on quantile regression and semiparametric methods
Quantile regression and other semiparametric models have been widely recognized as important data analysis tools in statistics and econometrics. These methods do not rely strictly on parametric likelihood but avoid the curse of dimensionality associated with many nonparametric models. The journal Computational Statistics and Data Analysis regularly publishes papers on these semiparametric methods, ...
Journal: Computational Statistics & Data Analysis
Volume 69
Publication year: 2014